Web Survey Bibliography
A popular example of this is the ‘forced response’ option, whose impact is analysed within this research project. The ‘forced response’ option is commonly described as a feature that forces the respondent to give an answer to each question asked. In most online survey software, it is enabled via a simple checkbox.
Relevance: The use of this option has increased tremendously; however, inquirers are often unaware of its possible consequences. Software manuals praise the option as a strategy that significantly reduces item non-response. In contrast, research studies raise considerable doubts about this strategy (Kaczmirek 2005; Peytchev/Crawford 2005; Dillman/Smyth/Christian 2009; Schnell/Hill/Esser 2011; Jacob/Heinz/Décieux 2013). These doubts rest on the assumption that respondents typically have plausible reasons for not answering a question, such as not understanding the question, the absence of an appropriate response category, or personal reasons (e.g. privacy).
Research question: Our thesis is that forcing respondents to select an answer may have two consequences: increasing unit non-response (higher dropout rates) and decreasing the validity of the answers (lying or random answering).
Methods and data: To analyse the consequences of implementing the ‘forced response’ option, we use split-ballot field experiments. Our analysis focuses in particular on dropout rates and response behaviour. Our first split-ballot experiment was carried out in July 2014 (n=1056), and a second experiment is planned for February 2015, so that we will be able to present results based on strong data evidence.
First results: If respondents are forced to answer each question, they
- abandon the study earlier, and
- choose the response category “No” more often (for sensitive issues).
Web survey bibliography - 2015 (291)
- Adding Postal Follow-Up to a Web-Based Survey of Primary Care and Gastroenterology Clinic Physician...; 2015; Partin, M. R.; Powell, A. A.; Burgess, D. J.; Haggstrom, D. A.; Gravely, A. A.; Halek, K.; Bangerter...
- Can Non-full-probability Internet Surveys Yield Useful Data? A Comparison with Full-probability Face...; 2015; Simmons, A.D.; Bobo, L. D.
- Participation rates, response bias and response behaviours in the community survey of the Swiss Spinal...; 2015; Fekete, C.; Segerer, W.; Gemperli, A.; Brinkhof, M.W.G.
- The Cathie Marsh lecture: What does the failure of the polls tell us about the future of survey research...; 2015; Sturgis, P.; Matheson, J.
- GreenBook Research Industry Trends Report; 2015; Murphy, L. (Ed.)
- Designing web surveys for the multi-device internet; 2015; de Bruijne, M.
- Data Quality Standards in Mixed Mode Surveys; 2015; Bremer, J.; Barbulescu, M.; Bennett, J.
- Mobile Surveys for Kids: Making Surveys G-rated; 2015; Simpson, B.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- A Review of Issues in Gamified Surveys; 2015; Keusch, F.; Zhang, C.
- Mobility Enabled: Effects of Mobile Devices on Survey Response and Substantive Measures; 2015; Barlas, F. M.; Randall, T. K.
- Innovations in Email Invitation Design for Today’s Digital World; 2015; Saunders, T.; Kessler, A.
- Gamification in Survey Research: Do The Results Support The Evangelists?; 2015; Pashupati, K.; Weber-Raley, L.
- What They Can’t See Can Hurt You: Improving Grids for Mobile Devices; 2015; Randall, T. K.; Barlas, F. M.; Graham, P.; Subias, T.
- Same, Same but Different: Effects of mixing Web and mail modes in audience research; 2015; Bergstroem, A.
- Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities; 2015; Lee, C.-K.; Back, K.-J.; Williams, Ro. J.; Ahn, S.-S.
- An Empirical Test of Nonresponse Bias in Internet Surveys; 2015; af Wahlberg, AE; Poom, L.
- Hidden Populations, Online Purposive Sampling, and External Validity: Taking off the Blindfold; 2015; Barrat, M. J.; Ferris, J. A.; Lenton, S.
- An experiment testing six formats of 101-point rating scales; 2015; Liu, M.; Conrad, F. G.
- Emerging Technologies: The Rise of Mobile Devices: From Smartphones to Smart Surveys; 2015; Buskirk, T. D.
- A Free Audio-CASI Module for LimeSurvey; 2015; Beier, H.; Schulz, S.
- Self-identification of occupation in web surveys: requirements for search trees and look-up tables; 2015; Tijdens, K. G.
- Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face...; 2015; Struminskaya, B.; De Leeuw, E. D.; Kaczmirek, L.
- Using Incentives and Multiple Modes of Data Collection to Improve Response Rate: Results from the National...; 2015; Howden, L. M.; Joestl, S. S.; Cohen, R. A.
- Utilizing iPads in the Field; 2015; Kiser, P.
- Mixed Mode Design Considerations; 2015; Hupp, A.
- Enhancing Response Usability in a Web-based Survey, But Did Anyone Use It?; 2015; Yoder, R.
- Mixed Mode Design Considerations: Panel Discussion; 2015; Smyth, J.; Hupp, A.; Elver, K.
- PayPal? An Incentive to Check-out?; 2015; Franklin, J.; Rasmussen, C.; Pruitt, J.; Waller, D.
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2015; 2015
- Mixed mode surveys; 2015; Burton, J.
- The Evolution of CAPI; 2015; Johnson, A. J.
- Online Questionnaire for Survey Research: Comparative Study of the Features Available with Free Account...; 2015; Subhash, K.
- An Experiment in Open End Response Length in Relation to Text Box Length in a Web Survey; 2015; Traugott, M. W.; Antoun, C.
- Item nonresponse in open-ended questions: Identification and reduction in web surveys; 2015; Kaczmirek, L.; Behr, D.
- Metrics and Design Tool for Building and Evaluating Probability-Based Online Panels; 2015; DiSogra, C.; Callegaro, M.
- Impact of raising awareness of respondents on the measurement quality in a web survey; 2015; Revilla, M.
- Analysis of four recruitment methods for obtaining normative data through a Web-based questionnaire:...; 2015; Nolte, M. T.; Shauver, M. J.; Chung, K. C.
- Open narrative questions in PC and smartphones: is the device playing a role?; 2015; Revilla, M.; Ochoa, C.
- Gamification of Online Surveys: Design Process, Case Study, and Evaluation; 2015; Harms, J.; Biegler, S.; Wimmer, C.; Kappel, K.; Grechenig, T.
- Equivalency of Paper Versus Tablet Computer Survey Data; 2015; Ravert, R. D.; Gomez-Scott, J.; Donnellan, M. B.
- Higher response rates at the expense of validity? Consequences of the implementation of the ‘forced...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- Development and Validation of a Scale for Social Exhibitionism on the Internet (SEXI); 2015; Vetter, M.; Eib, C.; Hill-Kloss, S.; Wollscheid, P.; Hagemann, D.
- Comparison of self-administered survey questionnaire responses collected using mobile apps versus other...; 2015; Belisario, J. S. M.; Jamsek, J.; Huckvale, K.; O'Donoghue, J.; Morrison, C. P.; Car, J.
- Evaluating the Distorting Effects of Inattentive Responding and Social Desirability on Self-Report Scales...; 2015; McKibben, W. B.; Silvia, P. J.
- A quasi-experiment on effects of prepaid versus promised incentives on participation in a probability...; 2015; Schaurer, I.; Bosnjak, M.
- Surveys: Question Wording and Response Categories; 2015; Schaeffer, N. C.; Dykema, J.
- Doing online research involving university students with disabilities: Methodological issues; 2015; De Cesarei, A.; Baldaro, B.
- Understanding Society Innovation Panel Wave 7: Results from Methodological Experiments; 2015; Blom, A. G.; Burton, J.; Booker, C. L.; Cernat, A.; Fairbrother, M.; Jaeckle, A.; Kaminska, O.; Keusch...